Before we talk about eigenvalues and eigenvectors, let us first remind ourselves that vectors can be transformed using matrices. For example, we can rotate a vector using a rotation matrix:

$$\begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} x' \\ y' \end{bmatrix}$$
Or we can use a matrix to scale a vector:
$$\begin{bmatrix} 10 & 0 \\ 0 & 10 \end{bmatrix} \begin{bmatrix} 5 \\ 10 \end{bmatrix} = \begin{bmatrix} 50 \\ 100 \end{bmatrix}$$
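The two transformations above can be sketched in NumPy (the rotation angle of 90° is chosen here just for illustration):

```python
import numpy as np

# Rotation by theta = 90 degrees: [[cos, -sin], [sin, cos]]
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Uniform scaling by a factor of 10
S = np.array([[10, 0],
              [ 0, 10]])

v = np.array([5, 10])
rotated = R @ v  # rotates v by 90 degrees: (5, 10) -> (-10, 5)
scaled = S @ v   # scales v: (5, 10) -> (50, 100), as in the example above
```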
Now let us go back to eigenvalues and eigenvectors. An eigenvector $v$ of a square matrix $A$ is defined as a non-zero vector such that multiplication with $A$ only changes the scale of the vector; it does not change its direction. The corresponding scalar $\lambda$ is called the eigenvalue.
$$Av = \lambda v$$
Because any scalar multiple of an eigenvector is again an eigenvector, there would be infinitely many solutions, so by convention we limit the magnitude of the vector to $\|v\|_2 = 1$.
Let us look at an example of how to calculate the eigenvalues and eigenvectors of

$$A = \begin{bmatrix} 0 & -2 \\ 1 & -3 \end{bmatrix}$$
For this we can rewrite the problem and solve the following equations:
$$\begin{aligned} Av &= \lambda v \\ Av - \lambda v &= 0 \\ Av - \lambda I v &= 0 \\ (A - \lambda I)v &= 0 \end{aligned}$$
For a non-zero solution $v$ to exist, the matrix $(A - \lambda I)$ must be singular, i.e. $\det(A - \lambda I) = 0$, which leads to the characteristic polynomial of $A$. Setting the characteristic polynomial equal to 0 and solving, we get between 0 and $n$ real eigenvalues, with $n$ being the number of dimensions of $A \in \mathbb{R}^{n \times n}$. For our example:

$$\det\begin{bmatrix} -\lambda & -2 \\ 1 & -3-\lambda \end{bmatrix} = \lambda^2 + 3\lambda + 2 = (\lambda + 1)(\lambda + 2) = 0$$

so the eigenvalues are $\lambda_1 = -1$ and $\lambda_2 = -2$.
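We can check such a calculation numerically with NumPy's `np.linalg.eig`, which returns the eigenvalues together with the eigenvectors (as columns, already normalized to unit length):

```python
import numpy as np

A = np.array([[0, -2],
              [1, -3]])

# eig returns (eigenvalues, eigenvectors); the i-th column of the
# second array is the eigenvector belonging to the i-th eigenvalue
eigenvalues, eigenvectors = np.linalg.eig(A)
print(sorted(eigenvalues))  # the eigenvalues -2 and -1

# Verify A v = lambda v for each eigenpair
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)
```

Note that `eig` does not guarantee any particular ordering of the eigenvalues, hence the `sorted` above.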
As presented in this video by 3Blue1Brown, there is a neat formula that can be used to calculate the eigenvalues of a $2 \times 2$ matrix such as $A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$. It rests upon two properties that have already been mentioned above:
The trace of $A$ is the sum of its eigenvalues, $\operatorname{tr}(A) = \sum_{i=1}^{n} \lambda_i$. In other words, $a + d = \lambda_1 + \lambda_2$. We can also rearrange this to get the mean value of the two eigenvalues: $\frac{1}{2}\operatorname{tr}(A) = \frac{a+d}{2} = \frac{\lambda_1 + \lambda_2}{2} = m$
The determinant of $A$ is the product of its eigenvalues, $\det(A) = \prod_{i=1}^{n} \lambda_i$. In other words, $ad - bc = \lambda_1 \cdot \lambda_2 = p$
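Combining the mean $m$ and the product $p$ gives the formula from the video, $\lambda_{1,2} = m \pm \sqrt{m^2 - p}$. A quick numerical sanity check for the example matrix from above:

```python
import numpy as np

A = np.array([[0, -2],
              [1, -3]])

m = np.trace(A) / 2   # mean of the two eigenvalues
p = np.linalg.det(A)  # product of the two eigenvalues

lam1 = m + np.sqrt(m**2 - p)
lam2 = m - np.sqrt(m**2 - p)
# lam1 and lam2 are -1 and -2 (up to floating point error),
# matching the characteristic-polynomial calculation
print(lam1, lam2)
```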
The eigendecomposition is a way to split a square matrix into 3 matrices, which can be useful in many applications. It can be derived fairly easily from the above, since each eigenpair gives us one equation:

$$Av_1 = \lambda_1 v_1 \qquad Av_2 = \lambda_2 v_2 \qquad Av_3 = \lambda_3 v_3$$
Instead of holding this information in three separate equations, we can combine them into one equation using matrices. We combine the eigenvectors into a matrix $Q$ where each column is an eigenvector, and we create a diagonal matrix $\Lambda$ with the eigenvalues on the diagonal (by convention in order of small to large):

$$AQ = Q\Lambda \quad\Rightarrow\quad A = Q \Lambda Q^{-1}$$
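As a sketch of this in NumPy, we can build $Q$ and $\Lambda$ from `np.linalg.eig` and confirm that the three factors multiply back to $A$:

```python
import numpy as np

A = np.array([[0, -2],
              [1, -3]])

eigenvalues, Q = np.linalg.eig(A)  # columns of Q are the eigenvectors
Lam = np.diag(eigenvalues)         # eigenvalues on the diagonal

# Reconstruct A from its eigendecomposition: A = Q Lambda Q^-1
A_rebuilt = Q @ Lam @ np.linalg.inv(Q)
print(np.allclose(A, A_rebuilt))  # True
```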
If $A$ is a symmetric matrix, then $Q$ is guaranteed to be an orthogonal matrix, because its columns are orthonormal eigenvectors of $A$. Because $Q$ is orthogonal, $Q^{-1} = Q^T$, which simplifies the formula to

$$A = Q \Lambda Q^T$$
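For the symmetric case, NumPy offers the specialized `np.linalg.eigh`, which returns the eigenvalues in ascending order together with an orthonormal set of eigenvectors. A small sketch, using an arbitrary symmetric example matrix:

```python
import numpy as np

# An arbitrary symmetric example matrix (any real symmetric matrix works)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is for symmetric (Hermitian) matrices; eigenvalues come back
# in ascending order, eigenvectors as orthonormal columns of Q
eigenvalues, Q = np.linalg.eigh(A)
Lam = np.diag(eigenvalues)

print(np.allclose(Q @ Q.T, np.eye(2)))  # Q is orthogonal: Q Q^T = I
print(np.allclose(A, Q @ Lam @ Q.T))    # A = Q Lambda Q^T
```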